Nature-inspired heuristics
by daniel-hromada
@ Design & Computation - Perspectives in Engineering - Studio @ Old TU Berlin building

Simulated annealing

Simulated Annealing is a technique for finding good solutions to tough optimization problems. It takes its name from annealing in metallurgy, where metal is heated and then cooled slowly so that it settles into a stable, low-energy state. At first the algorithm runs at a high 'temperature', allowing big changes, even ones that look wrong, in order to explore different options. As it gradually cools down (lowers the temperature), it makes smaller, more careful adjustments. This process helps it avoid getting stuck with a not-so-great solution (a local optimum) and aims for a better one, although the result is not guaranteed to be the best possible (the global optimum).
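The cooling loop described above can be sketched in a few lines of Python. The objective function, step size, and cooling schedule below are arbitrary illustrative choices, not part of any standard API:

```python
import math
import random

def simulated_annealing(f, x0, temp=10.0, cooling=0.95, steps=500):
    """Minimize f: accept worse moves with probability exp(-delta/temp);
    the temperature decays every step, so late moves are conservative."""
    x, fx = x0, f(x0)
    best_x, best_fx = x, fx
    for _ in range(steps):
        candidate = x + random.uniform(-1.0, 1.0)   # propose a nearby move
        delta = f(candidate) - fx
        # always accept improvements; accept worsenings with Boltzmann probability
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x, fx = candidate, f(candidate)
            if fx < best_fx:                        # remember the best seen so far
                best_x, best_fx = x, fx
        temp *= cooling                             # cool down gradually
    return best_x, best_fx

# a bumpy objective with many local minima, started far from the good region
best_x, best_fx = simulated_annealing(lambda x: x * x + 3 * math.sin(5 * x), x0=4.0)
```

Early on, the Boltzmann test accepts almost anything; near the end, `temp` is tiny and the algorithm behaves like greedy hill-climbing.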

Ant colony optimization

Ant Colony Optimization is inspired by how real ants find the shortest paths to food. In this method, virtual ants roam through possible solutions, leaving pheromone trails. Stronger trails attract more ants, suggesting better solutions. Over time, the best paths have more pheromones, guiding ants towards them. It's great for complex problems like routing or scheduling, where finding the best route matters. It's like following a trail to the best answer based on the collective wisdom of many trials.
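A toy version of this feedback loop can be sketched with a few alternative paths to a food source. The deposit rule (1/length), evaporation rate, and colony size here are illustrative choices, not a canonical ACO parameterization:

```python
import random

def ant_colony(lengths, ants=20, iterations=50, evaporation=0.5):
    """Each ant picks a path with probability proportional to its pheromone;
    shorter paths receive a larger deposit (1/length), so they are reinforced."""
    pheromone = [1.0] * len(lengths)
    for _ in range(iterations):
        deposits = [0.0] * len(lengths)
        for _ in range(ants):
            # roulette choice: probability proportional to current pheromone
            chosen = random.choices(range(len(lengths)), weights=pheromone)[0]
            deposits[chosen] += 1.0 / lengths[chosen]   # shorter path, bigger deposit
        # evaporation keeps old trails from dominating forever
        pheromone = [(1 - evaporation) * p + d
                     for p, d in zip(pheromone, deposits)]
    return pheromone

trails = ant_colony([2.0, 5.0, 9.0])   # path 0 is the shortest
```

After a few dozen iterations the shortest path carries by far the strongest trail, so most ants follow it.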

Evolutionary optimization

Evolutionary Optimization mimics natural selection, like how animals evolve. Imagine a population of potential solutions. Those fitting the problem best (like the fittest animals) are chosen to 'reproduce.' They mix and mutate to create new solutions or 'offspring.' Over generations, this process 'evolves' better solutions, as unfit ones are discarded. It's a trial-and-error method using principles of evolution, effectively finding good solutions for complex problems by simulating survival of the fittest in a virtual environment.

Genetic Algorithm

A Genetic Algorithm is a method in evolutionary optimization that solves problems by mimicking natural evolution. Imagine a survival contest where each participant (solution) has traits (parameters). These solutions breed and mutate, creating new generations. The fittest solutions, judged by a fitness function, survive to breed again. Over time, this process 'evolves' increasingly effective solutions. It's like nature's trial-and-error but used for complex problems like route planning, where finding the best or a good-enough solution is essential.
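The whole breed-mutate-select cycle fits in a short sketch. The toy problem below (maximize the number of 1s in a bitstring, often called OneMax) and the specific operator choices are illustrative assumptions:

```python
import random

def genetic_algorithm(bits=20, pop_size=30, generations=60, mutation_rate=0.01):
    """Minimal GA on OneMax: fitness = number of 1s; tournament selection,
    one-point crossover, per-gene bit-flip mutation."""
    fitness = sum
    pop = [[random.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                                   # tournament of size 3
            return max(random.sample(pop, 3), key=fitness)
        nxt = []
        while len(nxt) < pop_size:
            a, b = pick(), pick()
            cut = random.randrange(1, bits)           # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ 1 if random.random() < mutation_rate else g
                     for g in child]                  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = genetic_algorithm()
```

Even this bare-bones loop reliably evolves near-perfect bitstrings within a few dozen generations.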

Selection

In evolutionary optimization, selection is like a survival test for candidate solutions, deciding which ones get to 'reproduce.' Selection operators are the rules determining who passes this test. They might choose the fittest solutions (those solving the problem best) or sometimes include random or less fit ones for diversity. 

Fitness function

In evolutionary optimization, a "fitness function" is like a scoring system that rates how good each candidate solution (or 'individual') is at solving the problem. Think of it as a judge in a competition, evaluating participants based on how well they meet certain criteria. The higher the score, the better the solution. This function is crucial because it guides the selection process, helping to decide which solutions should be kept, discarded, or combined to create the next generation of solutions.

Selection operators

elitism: select the N fittest individuals and copy them unchanged into the next generation

roulette-wheel: probability of survival into the next generation is proportional to the individual's fitness

tournament: repeatedly draw a small random group of k individuals; the fittest member of each group survives into the next generation
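The three operators above can be sketched as plain functions. The signatures (a population list plus a fitness callable) are an illustrative convention, not a standard API:

```python
import random

def elitism(pop, fitness, n):
    """Copy the n fittest individuals straight into the next generation."""
    return sorted(pop, key=fitness, reverse=True)[:n]

def roulette_wheel(pop, fitness):
    """Pick one survivor with probability proportional to fitness
    (fitness values must be non-negative)."""
    return random.choices(pop, weights=[fitness(ind) for ind in pop])[0]

def tournament(pop, fitness, k=3):
    """Draw k individuals at random; the fittest of the group wins."""
    return max(random.sample(pop, k), key=fitness)
```

Roulette-wheel pressure scales with fitness differences, while tournament pressure depends only on rank, which makes tournaments robust to badly scaled fitness functions.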

Variation

In evolutionary optimization, variation is the process of introducing diversity into the population of solutions. Like genetic mutations and breeding in nature, it involves altering the 'genes' (parameters) of candidate solutions to create new, different ones. This can be done through mutation (changing some parameters) or crossover (mixing parameters from two solutions). Variation is crucial for exploring new solutions and avoiding getting stuck with suboptimal ones, much like how biological diversity is key to the survival and evolution of species.

Crossover

Crossover combines the 'genes' of two parent solutions to produce offspring. In one-point crossover, for example, both parents are cut at the same random position and the tails are swapped, so each child inherits a contiguous block of parameters from each parent.
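One-point crossover, one common variant, can be sketched as follows; the list-based representation of individuals is an arbitrary illustrative choice:

```python
import random

def one_point_crossover(parent_a, parent_b):
    """Cut both parents at the same random point and swap the tails,
    producing two complementary offspring."""
    cut = random.randrange(1, len(parent_a))
    return (parent_a[:cut] + parent_b[cut:],
            parent_b[:cut] + parent_a[cut:])

child1, child2 = one_point_crossover([0] * 6, [1] * 6)
```

Every gene from both parents ends up in exactly one of the two children.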

Mutation

for numeric genes: additive mutation, multiplicative mutation, complex (imaginary) mutation

for symbolic ones: removal, addition or replacement of a symbol; metathesis (e.g. MORF -> FORM)

NOTE: Mutation rates often fall in the range of 0.5% to 1% per gene. This rate is low enough to prevent excessive random search (which can disrupt good solutions) and high enough to introduce diversity and enable the algorithm to explore new areas of the solution space.
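The numeric and symbolic operators listed above can be sketched as follows (the noise parameters and alphabet are arbitrary illustrative choices; complex/imaginary mutation is omitted here):

```python
import random

def additive_mutation(gene, sigma=0.1):
    """Numeric gene: perturb by small Gaussian noise."""
    return gene + random.gauss(0, sigma)

def multiplicative_mutation(gene, sigma=0.05):
    """Numeric gene: scale by a factor close to 1."""
    return gene * random.gauss(1, sigma)

def replace_symbol(genes, alphabet):
    """Symbolic genes: replace one symbol with a random one from the alphabet."""
    g = list(genes)
    g[random.randrange(len(g))] = random.choice(alphabet)
    return g

def metathesis(genes):
    """Symbolic genes: swap two symbols, e.g. MORF can become FORM."""
    g = list(genes)
    i, j = random.sample(range(len(g)), 2)
    g[i], g[j] = g[j], g[i]
    return g
```

Note that metathesis, unlike removal or addition, preserves the multiset of symbols, only their order changes.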

Replication

In evolutionary optimization, replication is like making copies of the best solutions. Imagine a survival contest where top performers are cloned. These copies then undergo changes (mutations) or combine features (crossover) to create new solutions. Replication ensures good traits are passed on, increasing chances that future generations will perform even better. It's used in algorithms to solve complex problems, where keeping and tweaking successful solutions gradually leads to finding the best or a very good answer.

Population

A population is the set of individuals (candidate solutions) that an evolutionary algorithm maintains and evolves at any given time.

Individual|Genotype|Chromosome

In evolutionary optimization, an "individual," also termed a "genotype" or "chromosome," is a candidate solution to a problem. Think of it like a recipe where each ingredient is a parameter of the solution. These parameters are the "genes" of the individual. Together, they define how the solution behaves or performs. Like in natural evolution, these individuals can be altered (mutated) or combined (through crossover) to create new solutions, in the quest to find the best or most effective answer.

Genetic programming

Genetic Programming (GP) is a type of evolutionary optimization where programs themselves evolve to solve problems. Imagine a computer automatically writing and modifying its own code to get better at a task. In GP, each 'individual' is a computer program. These programs are tested for their ability to solve a problem, and the best ones are modified (mutated) or combined (crossed over) to create new programs. Over generations, this process evolves programs that become increasingly effective at the task.

Grammatical evolution

Grammatical Evolution is a type of evolutionary optimization where solutions are generated using a predefined set of rules, like a grammar in language. Imagine creating sentences using grammar rules, but here, the 'sentences' are computer programs or formulas. Each solution must follow these rules, ensuring they make sense. Like in genetic programming, these solutions evolve over generations, becoming better at solving a problem. It's particularly useful when solutions need a specific structure or format, allowing for complex, yet orderly, evolution.

Caching

Caching is a technique used in computing to store frequently accessed data in a readily available location for quick retrieval. It's like having a small, fast memory space for storing the most commonly used information, reducing the need to repeatedly access slower storage areas. In problem-solving, caching is crucial for enhancing performance and efficiency. By remembering previously computed results, systems can avoid redoing complex calculations, significantly speeding up the process of solving repetitive or similar problems. This approach is widely used in software development, web browsing, and data processing.
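In Python, this kind of result-caching (memoization) is available out of the box via functools.lru_cache; the classic demonstration is recursive Fibonacci:

```python
from functools import lru_cache

@lru_cache(maxsize=None)            # remember every result ever computed
def fib(n):
    """Naive recursion is exponential; with the cache, each fib(k)
    is computed only once, making the call linear in n."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

result = fib(100)                   # returns instantly thanks to caching
```

Without the decorator, fib(100) would make on the order of 2^100 recursive calls; with it, about 100.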